Learning Statistics By Doing Statistics

Gary Smith
Pomona College

Journal of Statistics Education v.6, n.3 (1998)

Copyright (c) 1998 by Gary Smith, all rights reserved. This text may be freely shared among individuals, but it may not be republished in any medium without express written consent from the author and advance notification of the editor.


Key Words: Activity-based learning; Introductory statistics; Learning by doing; Team projects.

Abstract

To help students develop statistical reasoning, a traditional introductory statistics course was modified to incorporate a semester-long sequence of projects, with written and oral reports of the results. Student test scores improved dramatically, and students were overwhelmingly positive in their assessment of this new approach.

1. Introduction

1 A radical reform of introductory statistics classes has been advocated by many, often motivated by observations like Hogg's (1991): "students frequently view statistics as the worst course taken in college." Some statisticians believe that the goals of an introductory statistics course should be redirected from mathematical technique to data analysis. Others advocate changes in pedagogy, replacing passively received lectures with hands-on activities.

2 For most of my 25 years teaching statistics, I have felt that my role is to give clear lectures that transmit information from the expert (me) to the novices (students). This narcissistic perspective is flattering, but, I believe, misplaced. The focus should not be on the professor, but rather on the students. The proper question is not, How can I make my lectures more brilliant?, but rather, How can I help my students learn statistics?

3 Several authors have recommended laboratory-based courses, in-class activities, or course-long projects. To allow students to learn statistics by doing statistics, I now supplement my lectures with a sequence of biweekly out-of-class group projects with written and oral reports of the results. The student response has been overwhelmingly positive, converting me from a skeptic to a believer.

2. Working With Data

4 Many statisticians, including Bradstreet (1996) and Cobb (1991), argue that statistical reasoning should take precedence over statistical methods. Hogg (1991) wrote that, "At the beginning level, statistics should not be presented as a branch of mathematics. Good statistics is not equated with mathematical purity or rigor but is more closely associated with careful thinking."

5 Statistical reasoning is not an irrelevant abstraction. To demonstrate the power, elegance, and even beauty of statistical reasoning, realistic examples from a wide variety of disciplines can persuade students that they are learning critical-thinking skills that can be applied every day and in almost any career. It is important that the motivating examples be real. Students are more easily convinced of the power of statistical reasoning if they see it applied to questions that are interesting and real to them. Tools that are used to answer artificial questions will seem artificial too. In addition, students will remember a real-world question and how we answered it more easily than they will remember a contrived example.

6 The problem with relying on examples done by others is that students remain passive participants and do not experience firsthand the many issues that arise in data collection and analysis. As Hogg (1991) wrote: "Instead of asking students to work on 'old' data, even though real, is it not better to have them find or generate their own data? Projects give students experience in asking questions, defining problems, formulating hypotheses and operational definitions, designing experiments and surveys, collecting data and dealing with measurement error, summarizing data, analyzing data, communicating findings, and planning 'follow-up' experiments suggested by the findings." Similarly, Snee (1993) wrote that the "collection and analysis of data is at the heart of statistical thinking. Data collection promotes learning by experience and connects the learning process to reality."

3. Learning By Doing

7 One way to help students develop their statistical reasoning is to incorporate active-learning strategies that allow students to supplement what they have heard and read about statistics by actually doing statistics -- designing studies, collecting data, analyzing their results, preparing written reports, and giving oral presentations.

8 In arguing for experiential learning, Snee (1993) quotes the Chinese proverb, "I hear, I forget. I see, I remember. I do, I understand." Bradstreet (1996) writes that, "Learning is situated in activity. Students who use the tools of their education actively rather than just acquire them build an increasingly rich implicit understanding of the world in which they use the tools and of the tools themselves."

9 Many have proposed ways of actively engaging students in hands-on data collection. Bradstreet (1996) recommends a laboratory-based course. Others endorse in-class activities (Dietz 1993; Gnanadesikan, Scheaffer, Watkins, and Witmer 1997); a single three-week project (Hunter 1977); or a course-long project (Chance 1997; Fillebrown 1994; Ledolter 1995; Mackisack 1994). Nonetheless, Cobb (1993) summarized a dozen NSF grants intended to improve the teaching of statistics, and found that none involved student projects collecting and analyzing data.

10 If it is true that students learn by doing, then a series of out-of-class projects involving a variety of data and statistical tools may be more beneficial than in-class activities that must use very restricted kinds of data and do not allow sustained planning and analysis, or one long project that uses only one or two statistical tools. I consequently now use a semester-long sequence of projects to involve the students in hands-on data collection and analysis. This strategy allows more depth than in-class activities and more variety than course-long projects. To enhance their experiential learning, students prepare written and oral reports of their results.

11 Garfield (1993) discusses the details for implementing cooperative-learning strategies, and researchers have reported success from cooperative practices in introductory statistics classes (Dietz 1993; Jones 1991; Keeler and Steinhorst 1995; Shaughnessy 1977), though most of their cooperative activities are limited to homework and studying. In order to encourage teamwork (and reduce the grading burden), I divide the students in my class of thirty into three-person teams. With six biweekly projects, each member of a three-person team can do two written reports and two oral presentations of the project results. Large statistics classes can be divided into somewhat larger teams, with fewer reports per person. The use of teams fosters cooperative learning, develops team-working skills, and often builds considerable camaraderie.

12 I allow students to form their own three-person teams or partial two-person teams, and I complete the team assignments by literally drawing names out of a hat -- a physical demonstration of random selection! In practice, most students do not know each other well enough to form their own teams, but those who are friends seem to appreciate the opportunity to be teammates. Unless there are irreconcilable differences, the teams stay together all semester.

13 To discourage free riding, I announce at the beginning of the semester that I will ask each student at the end of the semester to evaluate each teammate's contribution to the team projects, and that I may adjust a student's project grades based on these reports. This evaluation form states: "Ideally, each team member will do an equal amount of work on team projects, helping others as much as the others help him or her. Please circle the comment that seems most apt for this person: does more than the other team members put together; helps us more than we help him or her; does a third of the team work, a fair share; contributes, but not as much as the other team members; contributes little or nothing to the team."

14 At the beginning of the semester, each team is given six biweekly projects to do over the course of the semester, each with a specific due date. The projects are staggered so that, in any given week, half of the teams are completing projects. The sequence of projects matches the coverage of the course material: a topic covered in class one week shows up in the projects due the following week.

15 Some projects involve a search of the library or the World Wide Web for data, some involve polling students or professors, and some involve data collected in other ways (for example, free throws by basketball players). The Appendix describes 20 of the 60 projects that I assigned in the spring of 1998 and (in brackets) the anticipated type of data, analysis, and results. One team's assignment in the spring of 1998 consisted of projects 1, 2, 3, 11, 15, and 19. Many of the projects were inspired by student term papers from previous semesters, others by conversations with colleagues. I have used many projects more than once; others have been replaced or rotated as I've expanded my collection of possible projects.

16 With my prior approval, teams can modify or replace any of the projects; in practice, projects are seldom changed. The team members can discuss their project with me or the teaching assistant; we do not help collect data, but we will answer questions about project design and data analysis.

17 To keep the workload reasonable, I no longer assign a semester-long term paper, and I have reduced the number of homework exercises each week from ten to five. We cover the same material in the same or similar textbook (Smith 1991, then Smith 1998), as my intent is to use projects not to teach different topics, but to aid student learning.

4. Learning By Writing

18 We can often improve and test our understanding of a subject by writing about it. For instance, some of my best students rewrite their lecture notes as essays in their own words. The process of writing about a subject can clarify and reinforce understanding. In addition, by attempting to rewrite the lecture in their own words, they can learn what parts of the lecture they don't really understand and consequently need to figure out.

19 In a statistics class, written reports of project results can be used as nontrivial writing assignments that help students to learn statistics, improve their writing skills, and overcome the preconception that statistics is just plug-and-chug. In my class, the team as a whole is responsible for the project analysis and implementation, but individual team members take turns writing the project reports. Thus each project receives two grades, an analysis grade that is given to every team member and a writing grade that is given to the report's author. During the course of the semester, each student writes two project reports. From the professor's viewpoint, two brief project reports are less burdensome to grade than a semester-long term paper.

20 I grade each report not only on the analysis, but also on whether the writing is clear, persuasive, and grammatically correct. When I hand out the initial project assignments, I include a page of instructions for preparing the reports and a page of grammar reminders. The instructions include this advice:

The purpose of your report is to explain your project's objectives, how you obtained your data, the inferences you draw from your data, and any reservations you have about your conclusions. It should be fair, honest, and interesting -- the kind of report that you would find informative and enjoy reading.

Any relevant data that you use should be included as an appendix to your report; this appendix can be handwritten as long as it is clear, clean, and readable. Data that are used in the body of your report should be presented clearly and effectively in tables or graphs.

The body of your report must be typed and should use clear, concise, and persuasive prose. It should be long enough to make the points you are trying to make, but not so long that the reader becomes bored. Not counting appendixes, tables, and graphs, I will be surprised if your report has fewer than 3 or more than 5 pages.

You can use the first person and a conversational tone ("We settled in at the library and started flipping pages, looking for the right data and trying not to be distracted by articles about OJ Simpson."), but don't be sloppy or use excessive slang ("We finally found the libary and some mags, but were bummed by all all the OJ arcticles."). I will deduct points for typographical and grammatical errors. I will add points for writing that holds the reader's attention while still being serious, not silly.

21 The grammar reminders include these tips:

22 In class, I say forcefully that statistics can be used to inform, but we need to communicate that information effectively. Excess words and passive constructions are particularly deadly. I suggest that once students have what they consider a final draft, they read it one more time looking for unnecessary words that can be trimmed. I also strongly recommend that the team members help one another by reading each other's reports, as peer evaluations often help both the author and the reader (Stromberg and Ramanathan 1996).

23 I post a few especially well-written papers on the course web site. Students seem to appreciate these very concrete examples of what the professor values, and to take pride in having their papers selected as model reports.

5. Learning By Speaking

24 Many students graduate from college having had no instruction or practice in public speaking, and, indeed, harboring a deep dread of having to speak to an audience. When graduates are asked, five or ten years out, what they wish they had learned in college, the ability to speak effectively and without fear is often near the top of the list.

25 I consequently now ask students to make brief oral presentations (five minutes maximum) of their project results. To encourage teamwork, the person who gives the oral report cannot be the person who authored the written report. These presentations can utilize handouts, an overhead projector, and other visual aids.

26 In a class of thirty, ten teams making biweekly oral reports take up about 20 to 30 minutes of classroom time a week. My hope is that these lost lecture minutes will be more than offset by the hours students spend outside class learning statistics by working on their projects.

27 The fear of the acute embarrassment that would result from an inaccurate, disorganized, or incoherent oral presentation provides a tremendous motivation to prepare adequately. In addition, attempts to express concepts in our own words can help us understand these ideas better and retain them longer. Writing assignments are one way to do this; speaking assignments are another. Oral reports can not only help students develop the ability to speak coherently and persuasively, but can also help them learn statistics.

28 I try to give students a few tips on how to be effective speakers. I tell them that we are all prone to nervous habits (fiddling with a button, putting a hand in a pocket, saying "um") that distract listeners and signal the speaker's nervousness. Speakers are usually unaware of these habits, and one of our jobs as a supportive classroom audience is to alert them to these problems. I also tell students that audiences have more confidence in speakers who don't rely much on notes: someone who reads a speech may be just reciting what someone else wrote. A memorized speech can have the same effect. The goal is to give an extemporaneous speech that tells the audience that the speaker knows the material and is expressing it in his or her own words. I also advise students to have lots of eye contact with individual members of the audience, instead of looking at notes, the floor, or the back of the room.

29 Just as the development of good writing skills requires useful feedback, so does the development of good speaking skills. At the conclusion of each presentation, I give the class a few moments to write down constructive suggestions, which I collect and give to the speaker at the end of class. If everyone says "slow down" or "speak up," the speaker will know this is a serious problem. This exercise also encourages everyone to think about what works and what doesn't. I make written suggestions too and grade each oral presentation. One enlightening practice is to write especially popular phrases (such as "um" and "basically") at the top of the page and tabulate how many times these are used by the student.

30 The use of a sequence of projects gives students an opportunity to improve their speaking abilities after receiving this feedback. Obviously, people cannot become effective speakers by giving two oral presentations, any more than they could become effective authors by writing two papers. Instead, these oral presentations should be viewed as opportunities to nurture and develop skills that will be honed over a lifetime.

31 Because thinking on one's feet is an important objective, I tell students to ask each speaker challenging questions. If the questions lag, I fire away. Even "dumb" questions can be useful, as they force the speaker to explain things differently and perhaps more clearly. Although they are not rewarded directly for asking questions, students find spirited exchanges between the speaker and the audience to be not only beneficial, but a great deal of fun. I have also noticed that students tend to ask tougher questions of the more accomplished or arrogant speakers and to take it easy on those who are struggling.

6. Authentic Assessment

32 Writing and speaking assignments are more difficult to grade than traditional computational questions, particularly since the latter can often be handled by multiple-choice tests. However, a student's success in meeting a course's objectives should be measured by authentic assessment techniques (Chance 1997; Garfield 1993). If we want students to understand and communicate statistical results, then their course grade should depend substantially on how well they do so. Cobb (1993) writes that, "Once we accept that assessment must be authentic, the most radical implication of TQM is that the entire course should be built of assessment tasks."

33 I do not go this far, but 40% of the course grade is now based on the team projects, 15% on homework exercises, 15% on the midterm, and 30% on the final examination. This compromise is intended to communicate to students the importance of the team projects, while also signaling that the projects are a means to an end -- that I expect them to learn the traditional material and will test them on that material in a traditional way.

34 For the team projects, each student accumulates 10 separate grades during the semester, each worth 4% of the course grade: six team grades on project design and analysis, two individual grades on written reports, and two individual grades on oral reports. Each of these is a letter grade (A, A-, B+, and so on), which is converted to a numerical equivalent when the overall course grade is calculated.
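
To make the bookkeeping concrete, the sketch below shows one way such a weighted grade might be computed. It is an illustration only: the 4-point letter-to-number mapping and the sample grades are assumptions for the example, not the course's actual conversion.

    # Illustrative only: one common 4-point conversion, not necessarily the
    # exact numerical equivalents used in the course
    LETTER_TO_NUMBER = {"A": 4.0, "A-": 3.7, "B+": 3.3, "B": 3.0, "B-": 2.7,
                        "C+": 2.3, "C": 2.0, "C-": 1.7, "D": 1.0, "F": 0.0}

    def course_grade(project_letters, homework, midterm, final):
        """Ten project letter grades at 4% each (40% total), combined with
        homework (15%), midterm (15%), and final exam (30%), all on a 0-4 scale."""
        projects = sum(LETTER_TO_NUMBER[g] for g in project_letters) * 0.04
        return projects + 0.15 * homework + 0.15 * midterm + 0.30 * final

    # Six team grades, two written-report grades, and two oral-report grades
    letters = ["A-", "B+", "A", "A-", "B+", "A", "A", "B+", "A-", "A"]
    print(round(course_grade(letters, homework=3.3, midterm=3.7, final=3.3), 2))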

7. Does It Work?

35 The first time that I incorporated team projects into a statistics course, I gave the students an anonymous survey on the last day of class; it was collected by one of the students and given to me after the semester grades had been recorded. There were four questions on this survey. The first was as follows:

This semester, I am trying an experiment. I reduced the number of homework exercises each week from 10 to 5 and eliminated the semester term paper. Instead, the class has been divided into 3-person teams to work on 6 biweekly projects, with each person responsible for 2 written essays and 2 oral presentations. Please advise me about how this experiment worked and how it might be improved. Do not sign your name and please be honest and candid.

Overall, I feel that this new format is

a terrible idea           a bad idea           a good idea           a great idea

Of the 30 students taking the class, 6 said it was a good idea and 24 said it was a great idea.

36 The other questions were open-ended: "What I like most about this course is," "What I like least about this course is," and "Suggestions for improving this course." All of the answers were positive and gratifying. The projects were mentioned often in the answers to the second question. The third and fourth questions were either left blank or answered positively. Here are some of the comments:

37 Examination scores also improved dramatically. Classes are not random samples, but I did not announce the change in format ahead of time and there was no apparent reason for a systematic change in the students who enrolled the first time that I taught the course with projects. The tests were of similar difficulty and covered the same chapters in the same textbook. Comparisons are now more problematic because I have since changed textbooks (from Smith 1991 to Smith 1998) and students know of the new format when they register for the course.

38 The last semester that I taught the course without team projects, the scores on the midterm examination had a mean of 80.79 and a standard deviation of 16.00, and the scores on the final examination had a mean of 80.27 and a standard deviation of 12.56. These are very close to the average scores in this class for the past 10 years.

39 Taught with projects, the scores on the midterm had a mean of 92.13 and a standard deviation of 6.96. This is the highest average score that I have ever had in a statistics class. I tried to make the final examination tougher in order to decompress the scores, but was only partly successful. The mean was 88.12 and the standard deviation 8.28.
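
Because these classes are not random samples, a formal significance test would claim too much; a more modest summary is the standardized difference in means (Cohen's d). The sketch below computes it from the means and standard deviations reported above, pooling the two standard deviations with equal weights -- an assumption, since only the project semester's enrollment of 30 is reported.

    from math import sqrt

    def cohens_d(mean1, sd1, mean2, sd2):
        """Standardized mean difference with an equal-weight pooled SD
        (an assumption, since both class sizes are not reported)."""
        pooled_sd = sqrt((sd1 ** 2 + sd2 ** 2) / 2)
        return (mean2 - mean1) / pooled_sd

    # Reported scores: last semester without projects vs. first semester with projects
    print("midterm d:", round(cohens_d(80.79, 16.00, 92.13, 6.96), 2))
    print("final   d:", round(cohens_d(80.27, 12.56, 88.12, 8.28), 2))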

40 Figure 1 shows boxplots of the test scores. Particularly noteworthy is the less-frequent occurrence of very low scores. In addition, no one dropped the project-based course; the previous semester, two students (a typical number) dropped the course after the midterm.




Figure 1. Boxplots for Two Adjacent Semesters.


41 Heartened by these test scores and the student enthusiasm, I am persuaded that a sequence of team projects with written reports and oral presentations is a promising approach that may help students learn statistics and also write and speak more effectively. Students appear to be aware of these benefits and to appreciate the opportunity to develop these skills.


Appendix: Twenty Projects

  1. Go to a local grocery store and collect these data for at least 75 breakfast cereals: cereal name; grams of sugar per serving; and the shelf location (bottom, middle, or top). Group the data by shelf location and use three boxplots to compare the sugar content by shelf location. [Observational data; using boxplots to summarize data, can also be used for ANOVA test; high-sugar cereals are often at child-eye height.]

  2. Use computer software to simulate 1,000 flips of a fair coin. Record the fraction of the flips that were heads after 10, 100, and 1,000 flips. Repeat this experiment 100 times and then use three histograms to summarize your results. [Simulation data; using histograms to summarize data; demonstrates central limit theorem and effect of sample size on standard deviation; a code sketch follows this list.]

  3. Estimate the average number of hours that students at this school sleep each day, including both nighttime sleep and daytime naps. Also estimate the percentage who have been up all night without sleeping at least once during the current semester. [Survey data; confidence intervals for quantitative and qualitative data; students sleep less than 8 hours and many have all-nighters; if done at the beginning and end of the term, the differences are as expected.]

  4. Estimate and compare the average words per sentence in People, Time, and New Republic. [Observational data; confidence interval with quantitative data; the order given is from fewest words to most; New Republic has some outlier sentences with close to 100 words.]

  5. Estimate the percentage of the seniors at this college who regularly read a daily newspaper, the percentage who can name the two U.S. senators from their home state, the percentage who are registered to vote, and the percentage who would almost certainly vote if a presidential election were held today. [Survey data; confidence intervals for qualitative data; far more students are registered and will vote than read a newspaper or can name their senators.]

  6. Conduct a taste test of either Coke versus Pepsi or Diet Coke versus Diet Pepsi. Survey at least 50 randomly selected students who identify themselves beforehand as cola drinkers with a definite preference for one of the brands you are testing. Give each subject a cup of each cola that has been coded in a way known only to you. Calculate the fraction of your sample whose choice in the taste test matches the brand identified beforehand as their favorite. (Do not tell your subjects that this is a test of their ability to identify their favorite brand; tell them it is a test of which tastes better.) Determine the two-sided p-value for a test of the null hypothesis that there is a 0.5 probability that a cola drinker will choose his or her favorite brand. [Experimental data; hypothesis test using binomial model; most students prefer Coke, but neither group is very successful at identifying its favorite; a code sketch follows this list.]

  7. Find five avid basketball players and ask each of them to shoot 100 free throws. Do not tell them the purpose of this exercise, which is to determine if a missed free throw is equally likely to bounce to the same or opposite side as their shooting hand. Use your data for each of these players to calculate the two-sided p-value for testing the null hypothesis that a missed free throw by this player is equally likely to bounce to either side. [Experimental data; hypothesis test using binomial model; coaches often say that the ball will bounce to shooting-hand side, but the data are unpersuasive.]

  8. Ask 50 female students these four questions: Among female students at this college, is your height above average or below average? Is your weight above average or below average? Is your intelligence above average or below average? Is your physical attractiveness above average or below average? Ask 50 male students these same questions (in comparison to male students at this college). Try to design a survey procedure that will ensure candid answers. For each gender and each question, test the null hypothesis that p = 0.5. [Survey data; hypothesis test using binomial model; most males think that they are above average.]

  9. Young children who play sports are often separated by age. In 1991, for example, children born in 1984 might have been placed in a 7-year-old league while children born in 1983 were placed in an 8-year-old league. Someone born in January 1984 is eleven months older than someone born in December 1984. Because coaches give more attention and playing time to better players, children with early birth dates may have an advantage when they are young that cumulates over the years. To test this theory, look at a professional sport and see how many players have birth dates during the first six months of the year. [Observational data; hypothesis test using binomial model; seems to be true.]

  10. College students are said to experience the Frosh 15 -- an average weight gain of 15 pounds during their first year at college. Test this folklore by asking at least 100 randomly selected students how much weight they gained or lost during their first year at college. Determine the two-sided p-value for testing the null hypothesis that the population mean is a 15-pound gain, and also determine a 95 percent confidence interval for the population mean. [Survey data; hypothesis test using t distribution; strongly rejected (is it a myth or do students misreport?); a code sketch follows this list.]

  11. What percentage of the seniors at your college expect to be married within five years of graduation? What percentage expect to have children within five years of graduation? How many biological children do the seniors at your college expect to have during their lives? Do males and females differ in their answers to these questions? [Survey data; two-sample test; few expect to be married or have children within five years of graduation; males plan to have slightly more children; if possible, a comparison with alumni records is interesting.]

  12. Ask a random sample of at least 50 students the following question: "During the school year, how many hours a week do you spend, on average, on school-related work -- for example, reading books, attending class, doing homework, and writing papers?" Ask a random sample of at least 25 professors this question: "During the school year, how many hours a week do you spend, on average, on school-related work -- for example, preparing lectures, teaching, grading, advising, serving on committees, and doing research?" Determine the p-value for a test at the 5 percent level of the null hypothesis that the two population means are equal. [Survey data; two-sample test; professors work twice as many hours as students.]

  13. Ask at least 100 randomly selected college students to write down their grade point average (GPA) and to indicate where they typically sit in large classrooms: in the very front, towards the front, in the middle, towards the back, or in the very back. If feasible, restrict your sample to students who are taking the same class or similar classes. Use an ANOVA F-test to see if the differences in the GPAs among these five categories are statistically significant at the 5 percent level. [Survey data; ANOVA; no statistically persuasive patterns.]

  14. Ask randomly selected college students if they have had a serious romantic relationship in the past two years and, if so, to identify the month in which the most recent relationship began. When you have found 120 students who answer yes and can identify the month, make a chi-square test of the null hypothesis that each month is equally likely for the beginning of a romantic relationship. [Survey data; chi-square test; the start of each term is a popular time for romance; a code sketch follows this list.]

  15. Ask 50 randomly selected students this question and then compare the male and female responses: "You have a coach ticket for a nonstop flight from Los Angeles to New York. Because the flight is overbooked, randomly selected passengers will be allowed to sit in open first-class seats. You are the first person selected. Would you rather sit next to: (a) the U.S. president; (b) the president's wife; or (c) Michael Jordan?" [Survey data; chi-square test; females choose the president's wife, males the president.]

  16. For each of the 50 states, calculate Bill Clinton's percentage of the total votes cast for the Democratic and Republican presidential candidates in 1992; do not include votes for other candidates. Do the same for the 1996 election. Is there a statistical relationship between these two sets of data? Are there any apparent outliers or anomalies? [Observational data; simple regression; extremely strong correlation with a few anomalies.]

  17. Select an automobile model and year (at least three years old) that is of interest to you -- for example, a 1993 Saab 900S convertible. Now find at least 30 of these cars that are for sale (either from dealers or private owners) and record the odometer mileage (x) and asking price (y). As best you can, try to keep the cars as similar as possible. For example, ignore the car color, but do not mix together 4-cylinder and 6-cylinder cars or manual and automatic transmissions. Estimate the equation y = a + bx + e and summarize your results. [Observational data; simple linear regression; good fit with reasonable coefficients and interesting outliers; a code sketch follows this list.]

  18. Pick a date and approximate time of day (for example, 10:00 in the morning on April 1) for scheduling nonstop flights from an airport near you to at least a dozen large U.S. cities. Determine the cost of a coach seat on each of these flights and the distance covered by each flight. Use your data to estimate a simple linear regression model with ticket cost the dependent variable and distance the explanatory variable. Are there any outliers? [Observational data; simple linear regression; good fit with reasonable coefficients and interesting outliers.]

  19. Go to a large bookstore that has a prominent display of best-selling fiction and nonfiction hardcover books. For each of these two categories, record the price and number of pages for at least ten books. Use these data to estimate a multiple regression model with price the dependent variable and three explanatory variables: a dummy variable that equals 0 if nonfiction and 1 if fiction, the number of pages, and the dummy variable multiplied by the number of pages. Are there any apparent outliers in your data? [Observational data; multiple regression; good fit with reasonable coefficients and interesting outliers; a code sketch follows this list.]

  20. Ask 100 randomly selected students to estimate their height and the heights of both of their biological parents. Also note the gender of each student in your sample. Now estimate a multiple regression model with the student's height as the dependent variable and the student's gender, mother's height, and father's height as the explanatory variables. [Survey data; multiple regression; good fit with reasonable coefficients and evidence of regression toward the mean.]
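
The sketches below suggest how the analyses for a few of these projects might be carried out. They are illustrations only, written in Python with placeholder numbers and assumed library choices (numpy, scipy, and matplotlib); any statistical package or spreadsheet would serve just as well, and none of the figures shown are actual class results. The first sketch follows project 2: it records the fraction of heads after 10, 100, and 1,000 simulated flips, repeats the experiment 100 times, and summarizes the results with three histograms.

    import numpy as np
    import matplotlib.pyplot as plt

    rng = np.random.default_rng()
    checkpoints = [10, 100, 1000]
    fractions = {n: [] for n in checkpoints}

    for _ in range(100):                        # repeat the experiment 100 times
        flips = rng.integers(0, 2, size=1000)   # one experiment: 1 = heads, 0 = tails
        for n in checkpoints:
            fractions[n].append(flips[:n].mean())   # fraction of heads so far

    fig, axes = plt.subplots(1, 3, sharex=True)
    for ax, n in zip(axes, checkpoints):
        ax.hist(fractions[n], bins=20, range=(0, 1))
        ax.set_title(f"after {n} flips")
    fig.suptitle("Fraction of heads in 100 repetitions")
    plt.show()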
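
For project 6 (and the other projects built on the binomial model), the exact two-sided p-value can be computed directly from the binomial distribution, as in the sketch below. It uses only the Python standard library; the numbers in the example call (31 of 50 cola drinkers choosing their professed favorite) are hypothetical, not class results.

    from math import comb

    def binom_pmf(k, n, p=0.5):
        """Probability of exactly k successes in n independent trials."""
        return comb(n, k) * p**k * (1 - p)**(n - k)

    def two_sided_p(k, n, p=0.5):
        """Exact two-sided p-value: the total probability, under the null
        hypothesis, of all outcomes no more likely than the one observed."""
        observed = binom_pmf(k, n, p)
        return sum(binom_pmf(i, n, p) for i in range(n + 1)
                   if binom_pmf(i, n, p) <= observed * (1 + 1e-9))

    # Hypothetical illustration: 31 of 50 cola drinkers pick their stated favorite
    print(two_sided_p(31, 50))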
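
Project 10 asks for a one-sample t-test and a 95 percent confidence interval for a mean. Assuming the scipy library, the analysis might be sketched as follows; the weight_changes list stands in for the survey responses (pounds gained, with losses negative) and is invented for illustration.

    import numpy as np
    from scipy import stats

    def frosh_15_test(weight_changes, null_mean=15.0):
        """Two-sided t-test of H0: mean weight gain = 15 pounds, plus a
        95 percent confidence interval for the population mean."""
        x = np.asarray(weight_changes, dtype=float)
        t_stat, p_value = stats.ttest_1samp(x, popmean=null_mean)
        half_width = stats.t.ppf(0.975, df=len(x) - 1) * x.std(ddof=1) / np.sqrt(len(x))
        return t_stat, p_value, (x.mean() - half_width, x.mean() + half_width)

    # Invented responses (pounds gained during the first year; losses are negative)
    print(frosh_15_test([3, -2, 8, 0, 5, 10, -4, 2, 6, 1]))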
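
Project 14 is a chi-square goodness-of-fit test of a uniform distribution across the twelve months. A sketch, again assuming scipy, is below; the monthly counts are placeholders that merely sum to the 120 respondents the project requires.

    from scipy.stats import chisquare

    # Placeholder monthly counts (January through December) summing to the
    # 120 respondents the project asks for -- not actual survey results
    counts = [14, 9, 7, 8, 6, 5, 7, 6, 18, 12, 15, 13]

    # Under the null hypothesis each month is equally likely (expected count 10),
    # which is chisquare's default when no expected frequencies are given
    stat, p_value = chisquare(counts)
    print(f"chi-square = {stat:.2f}, p-value = {p_value:.4f}")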
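
Projects 17 and 18 call for a simple linear regression. One way to estimate and summarize the line, assuming scipy, is sketched below; the mileage and price arrays are placeholders for the data a team would collect (the project asks for at least 30 cars).

    import numpy as np
    from scipy.stats import linregress

    # Placeholder data: odometer mileage (thousands of miles) and asking price (dollars)
    mileage = np.array([32, 45, 51, 60, 72, 80, 95, 110, 120, 140])
    price = np.array([14500, 13200, 12800, 11900, 10700, 10400, 9100, 8200, 7900, 6500])

    fit = linregress(mileage, price)
    print(f"price = {fit.intercept:.0f} + ({fit.slope:.1f}) * mileage")
    print(f"r-squared = {fit.rvalue ** 2:.3f}, p-value for slope = {fit.pvalue:.4g}")

    # Residuals help flag the interesting outliers the project anticipates
    residuals = price - (fit.intercept + fit.slope * mileage)
    print(f"largest residual (dollars): {residuals[np.argmax(np.abs(residuals))]:.0f}")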
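
Project 19's model, price = a + b1(fiction) + b2(pages) + b3(fiction x pages), can be estimated by ordinary least squares. The sketch below builds the design matrix by hand and uses numpy's least-squares solver; the prices and page counts are invented placeholders (the project asks for at least ten books in each category).

    import numpy as np

    # Placeholder data: fiction dummy (0 = nonfiction, 1 = fiction),
    # page counts, and prices in dollars -- not actual bookstore observations
    fiction = np.array([0, 0, 0, 0, 0, 1, 1, 1, 1, 1])
    pages = np.array([480, 352, 608, 416, 512, 320, 288, 384, 416, 352])
    price = np.array([27.5, 24.0, 30.0, 25.0, 28.0, 24.0, 23.0, 25.5, 26.0, 24.5])

    # Design matrix: intercept, dummy, pages, and the dummy-times-pages interaction
    X = np.column_stack([np.ones(len(pages)), fiction, pages, fiction * pages])
    a, b_fiction, b_pages, b_interaction = np.linalg.lstsq(X, price, rcond=None)[0]
    print(f"price = {a:.2f} + {b_fiction:.2f}*fiction + {b_pages:.4f}*pages"
          f" + {b_interaction:.4f}*fiction*pages")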

References

Bradstreet, T. E. (1996), "Teaching Introductory Statistics Courses So That Nonstatisticians Experience Statistical Reasoning," The American Statistician, 50, 69-78.

Chance, B. L. (1997), "Experiences with Authentic Assessment Techniques in an Introductory Statistics Course," Journal of Statistics Education, [Online], 5(3). (http://www.amstat.org/publications/jse/v5n3/chance.html)

Cobb, G. W. (1991), "Teaching Statistics: More Data, Less Lecturing," Amstat News, December, No. 182, 1 and 4.

----- (1993), "Reconsidering Statistics Education: A National Science Foundation Conference," Journal of Statistics Education, [Online], 1(1). (http://www.amstat.org/publications/jse/v1n1/cobb.html)

Dietz, E. J. (1993), "A Cooperative Learning Activity on Methods of Selecting a Sample," The American Statistician, 47, 104-108.

Fillebrown, S. (1994), "Using Projects in an Elementary Statistics Course for Non-Science Majors," Journal of Statistics Education, [Online], 2(2). (http://www.amstat.org/publications/jse/v2n2/fillebrown.html)

Garfield, J. (1993), "Teaching Statistics Using Small-Group Cooperative Learning," Journal of Statistics Education, [Online], 1(1). (http://www.amstat.org/publications/jse/v1n1/garfield.html)

Gnanadesikan, M., Scheaffer, R. L., Watkins, A. E., and Witmer, J. A. (1997), "An Activity-Based Statistics Course," Journal of Statistics Education, [Online], 5(2). (http://www.amstat.org/publications/jse/v5n2/gnanadesikan.html)

Hogg, R. V. (1991), "Statistical Education: Improvements Are Badly Needed," The American Statistician, 45, 342-343.

Hunter, W. G. (1977), "Some Ideas About Teaching Design of Experiments, with 2^5 Examples of Experiments Conducted by Students," The American Statistician, 31, 12-17.

Jones, L. (1991), "Using Cooperative Learning to Teach Statistics," Research Report Number 91-2, The L. L. Thurstone Psychometric Laboratory, University of North Carolina.

Keeler, C. M., and Steinhorst, R. K. (1995), "Using Small Groups to Promote Active Learning in the Introductory Statistics Course: A Report from the Field," Journal of Statistics Education, [Online], 3(2). (http://www.amstat.org/publications/jse/v3n2/keeler.html)

Ledolter, J. (1995), "Projects in Introductory Statistics Courses," The American Statistician, 49, 364-367.

Mackisack, M. (1994), "What Is the Use of Experiments Conducted By Statistics Students?," Journal of Statistics Education, [Online], 2(1). (http://www.amstat.org/publications/jse/v2n1/mackisack.html)

Shaughnessy, J. M. (1977), "Misconceptions of Probability: An Experiment With a Small-Group Activity-Based Model Building Approach to Introductory Probability at the College Level," Educational Studies in Mathematics, 8, 285-315.

Snee, R. D. (1993), "What's Missing in Statistical Education?," The American Statistician, 47, 149-154.

Smith, G. (1991), Statistical Reasoning (3rd ed.), Boston: Allyn and Bacon.

----- (1998), Introduction to Statistical Reasoning, Boston: WCB/McGraw-Hill.

Stromberg, A. J., and Ramanathan, S. (1996), "Easy Implementation of Writing in Introductory Statistics Courses," The American Statistician, 50, 159-163.


Gary Smith
Pomona College
Claremont, CA 91711

gsmith@pomona.edu

